An Empirical Evaluation of Bagging and Boosting

Authors

  • Richard Maclin
  • David W. Opitz
Abstract

An ensemble consists of a set of independently trained classifiers (such as neural networks or decision trees) whose predictions are combined when classifying novel instances. Previous research has shown that an ensemble as a whole is often more accurate than any of the single classifiers in the ensemble. Bagging (Breiman 1996a) and Boosting (Freund & Schapire 1996) are two relatively new but popular methods for producing ensembles. In this paper we evaluate these methods using both neural networks and decision trees as our classification algorithms. Our results clearly show two important facts. The first is that, even though Bagging almost always produces a better classifier than any of its individual component classifiers and is relatively impervious to overfitting, it does not generalize any better than a baseline neural-network ensemble method. The second is that Boosting is a powerful technique that can usually produce better ensembles than Bagging; however, it is more susceptible to noise and can quickly overfit a data set.
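The mechanism the abstract describes — training classifiers independently on bootstrap replicates of the data and combining their predictions by vote — can be sketched in a few lines. The following is an illustrative pure-Python toy, not code from the paper: it uses one-dimensional threshold "stumps" as the base classifier, where the paper uses neural networks and decision trees.

```python
import random
from collections import Counter

def train_stump(data):
    """Fit a 1-D threshold 'stump': predict 1 iff pol * (x - t) >= 0."""
    best_err, best = len(data) + 1, None
    for t, _ in data:                      # candidate thresholds at the data points
        for pol in (1, -1):                # try both polarities
            err = sum(1 for x, y in data
                      if (1 if pol * (x - t) >= 0 else 0) != y)
            if err < best_err:
                best_err, best = err, (t, pol)
    t, pol = best
    return lambda x: 1 if pol * (x - t) >= 0 else 0

def bagging(data, n_learners=25, seed=0):
    """Train each stump on a bootstrap replicate; combine by majority vote."""
    rng = random.Random(seed)
    learners = []
    for _ in range(n_learners):
        boot = [rng.choice(data) for _ in data]   # sample n points WITH replacement
        learners.append(train_stump(boot))
    def predict(x):
        votes = Counter(h(x) for h in learners)
        return votes.most_common(1)[0][0]
    return predict
```

A single stump trained on one bootstrap replicate can land on a poor threshold; the majority vote over many replicates smooths this out, which is the variance-reduction effect the abstract attributes to Bagging.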


Related resources

Combining Bagging and Additive Regression

Bagging and boosting are among the most popular resampling ensemble methods that generate and combine a diversity of regression models using the same learning algorithm as the base learner. Boosting algorithms are considered stronger than bagging on noise-free data. However, there are strong empirical indications that bagging is much more robust than boosting in noisy settings. For this reason, in t...


Combining Bagging and Boosting

Bagging and boosting are among the most popular resampling ensemble methods that generate and combine a diversity of classifiers using the same learning algorithm for the base classifiers. Boosting algorithms are considered stronger than bagging on noise-free data. However, there are strong empirical indications that bagging is much more robust than boosting in noisy settings. For this reason, i...
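The noise-sensitivity contrast this teaser alludes to follows from the mechanism: bagging resamples uniformly, while boosting reweights the training set toward the examples it currently gets wrong — which in a noisy setting includes the mislabeled ones. A minimal AdaBoost-style loop makes the reweighting step explicit (an illustrative sketch with labels in {-1, +1} and a weighted stump learner; none of this is code from the cited papers):

```python
import math

def train_weighted_stump(data, w):
    """Fit a 1-D stump (labels in {-1, +1}) minimizing *weighted* error."""
    best_err, best = float("inf"), None
    for t, _ in data:
        for pol in (1, -1):
            err = sum(wi for (x, y), wi in zip(data, w)
                      if (pol if x >= t else -pol) != y)
            if err < best_err:
                best_err, best = err, (t, pol)
    t, pol = best
    return (lambda x: pol if x >= t else -pol), best_err

def adaboost(data, rounds=5):
    n = len(data)
    w = [1.0 / n] * n                             # start from uniform weights
    ensemble = []
    for _ in range(rounds):
        h, err = train_weighted_stump(data, w)
        err = min(max(err, 1e-10), 1 - 1e-10)     # keep the log() finite
        alpha = 0.5 * math.log((1 - err) / err)   # this learner's vote weight
        ensemble.append((alpha, h))
        # Reweight: misclassified points (y * h(x) == -1) gain weight, so
        # later learners focus on them -- including any mislabeled points.
        w = [wi * math.exp(-alpha * y * h(x)) for (x, y), wi in zip(data, w)]
        z = sum(w)
        w = [wi / z for wi in w]
    def predict(x):
        return 1 if sum(a * h(x) for a, h in ensemble) >= 0 else -1
    return predict
```

The reweighting line is where noisy labels do their damage: a mislabeled point is misclassified round after round, so its weight grows and later learners contort the ensemble to fit it.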


Improving reservoir rock classification in heterogeneous carbonates using boosting and bagging strategies: A case study of early Triassic carbonates of coastal Fars, south Iran

An accurate reservoir characterization is a crucial task for the development of quantitative geological models and reservoir simulation. In the present research, a novel view of reservoir characterization is presented, using the advantages of thin-section image analysis and intelligent classification algorithms. The proposed methodology comprises three main steps. First, four classes of...


Combining Bias and Variance Reduction Techniques for Regression Trees

Gradient Boosting and bagging applied to regressors can reduce the error due to bias and variance respectively. Alternatively, Stochastic Gradient Boosting (SGB) and Iterated Bagging (IB) attempt to simultaneously reduce the contribution of both bias and variance to error. We provide an extensive empirical analysis of these methods, along with two alternate bias-variance reduction approaches — ...





Publication date: 1997